CS 188 Spring 2013 Introduction to Artificial Intelligence
Midterm II

- You have approximately 1 hour and 50 minutes.
- The exam is closed book, closed notes except a one-page crib sheet.
- Please use non-programmable calculators only.
- Mark your answers ON THE EXAM ITSELF. If you are not sure of your answer you may wish to provide a brief explanation. All short answer sections can be successfully answered in a few sentences AT MOST.

First name / Last name / SID / EdX username
First and last name of student to your left
First and last name of student to your right

For staff use only:
Q1. Bayes Nets Representation /17
Q2. Bayes Net Reasoning /12
Q3. Variable Elimination /21
Q4. Bayes Net Sampling /14
Q5. Probability, Bayes Nets and Decision Networks /28
Q6. Perceptron /8
Total /100
Q1. [17 pts] Bayes Nets Representation

(a) [8 pts] Graph Structure: Conditional Independence

Consider the Bayes net given below over the nodes A, B, C, D, E, F, G, H, I. Remember that X ⊥ Y reads as "X is independent of Y given nothing," and X ⊥ Y | {Z, W} reads as "X is independent of Y given Z and W." For each expression, fill in the corresponding circle to indicate whether it is True or False.

(i) It is guaranteed that A ⊥ B. False: there is an active path A - B.
(ii) It is guaranteed that A ⊥ C. True: no active paths.
(iii) It is guaranteed that A ⊥ D | E. False: an active path A - B - E (observed) - D.
(iv) It is guaranteed that A ⊥ I | E. False: an active path A - B - E (observed) - D - G - H - I.
(v) It is guaranteed that B ⊥ C | I. False: an active path B - E (descendant I observed) - F - C.
(vi) It is guaranteed that F ⊥ A | H. False: an active path F - E (descendant H observed) - B - A.
(vii) It is guaranteed that D ⊥ I | {E, G}. True: no active paths.
(viii) It is guaranteed that C ⊥ H | G. False: an active path C - F - E - H.
(b) Marginalization and Conditioning

Consider a Bayes net over the random variables A, B, C, D, E with the structure shown below, with full joint distribution P(A, B, C, D, E). The following three questions describe different, unrelated situations (your answers to one question should not influence your answers to the others).

(i) [3 pts] Consider the marginal distribution P(A, B, C, E) = Σ_d P(A, B, C, d, E), where D was eliminated. On the diagram below (nodes A, B, C, E), draw the minimal number of arrows that results in a Bayes net structure that is able to represent this marginal distribution. If no arrows are needed, write "No arrows needed."

Multiple solutions exist; each solution has exactly the same set of conditional independence assumptions as the graph shown. These other solutions have the same set of edges, but the directionality could differ in such a way that every triple has the same active/inactive properties as above. Concretely, one could change the directionality of A - C, or of A - B, but not of both A - C and A - B at the same time.

(ii) [3 pts] Assume we are given an observation: A = a. On the diagram below (nodes B, C, D, E), draw the minimal number of arrows that results in a Bayes net structure that is able to represent the conditional distribution P(B, C, D, E | A = a). If no arrows are needed, write "No arrows needed."

Only one solution exists for this question. The solution needs to have all edges in the original graph, plus additional edges as needed to ensure the same nodes are connected by active paths when A is observed. Observing A doesn't activate any paths; in fact, all paths through A are inactive with A observed, so no additional edges are needed.

(iii) [3 pts] Assume we are given an observation: D = d. On the diagram below (nodes A, B, C, E), draw the minimal number of arrows that results in a Bayes net structure that is able to represent the conditional distribution P(A, B, C, E | D = d). If no arrows are needed, write "No arrows needed."

Multiple solutions exist. The most natural choice is the one shown on the left; other solutions have the same set of conditional independence assumptions as that graph.
Q2. [12 pts] Bayes Net Reasoning

P(A | D, X):
  +d, +x: P(+a) = 0.9, P(-a) = 0.1
  +d, -x: P(+a) = 0.8, P(-a) = 0.2
  -d, +x: P(+a) = 0.6, P(-a) = 0.4
  -d, -x: P(+a) = 0.1, P(-a) = 0.9

P(D): P(+d) = 0.1, P(-d) = 0.9

P(X | D): P(+x | +d) = 0.7, P(-x | +d) = 0.3; P(+x | -d) = 0.8, P(-x | -d) = 0.2

P(B | D): P(+b | +d) = 0.7, P(-b | +d) = 0.3; P(+b | -d) = 0.5, P(-b | -d) = 0.5

(a) [3 pts] What is the probability of having disease D and getting a positive result on test A?
P(+d, +a) = Σ_x P(+d, x, +a) = Σ_x P(+a | +d, x) P(x | +d) P(+d) = P(+d) Σ_x P(+a | +d, x) P(x | +d) = (0.1)((0.9)(0.7) + (0.8)(0.3)) = 0.087

(b) [3 pts] What is the probability of not having disease D and getting a positive result on test A?
P(-d, +a) = Σ_x P(-d, x, +a) = Σ_x P(+a | -d, x) P(x | -d) P(-d) = P(-d) Σ_x P(+a | -d, x) P(x | -d) = (0.9)((0.6)(0.8) + (0.1)(0.2)) = 0.45

(c) [3 pts] What is the probability of having disease D given a positive result on test A?
P(+d | +a) = P(+a, +d) / P(+a) = P(+a, +d) / Σ_d P(+a, d) = 0.087 / (0.087 + 0.45) ≈ 0.162

(d) [3 pts] What is the probability of having disease D given a positive result on test B?
P(+d | +b) = P(+b | +d) P(+d) / P(+b) = P(+b | +d) P(+d) / Σ_d P(+b | d) P(d) = (0.7)(0.1) / ((0.7)(0.1) + (0.5)(0.9)) = 0.07 / 0.52 ≈ 0.135
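The four answers above can be cross-checked by enumerating the given CPTs directly. This is a minimal sketch; the dictionary layout is an editorial choice, while all probability values are copied from the tables above.

```python
# Sanity check of the Q2 computations, using the CPTs from the question.
P_d = {'+d': 0.1, '-d': 0.9}
P_x_given_d = {('+x', '+d'): 0.7, ('-x', '+d'): 0.3,
               ('+x', '-d'): 0.8, ('-x', '-d'): 0.2}
P_a_given_dx = {('+a', '+d', '+x'): 0.9, ('+a', '+d', '-x'): 0.8,
                ('+a', '-d', '+x'): 0.6, ('+a', '-d', '-x'): 0.1}

def joint_da(d):
    # P(d, +a) = sum_x P(+a | d, x) P(x | d) P(d)
    return sum(P_a_given_dx[('+a', d, x)] * P_x_given_d[(x, d)] * P_d[d]
               for x in ('+x', '-x'))

p_pd_pa = joint_da('+d')                                  # part (a)
p_nd_pa = joint_da('-d')                                  # part (b)
p_pd_given_pa = p_pd_pa / (p_pd_pa + p_nd_pa)             # part (c), by Bayes' rule
p_pd_given_pb = (0.7 * 0.1) / (0.7 * 0.1 + 0.5 * 0.9)     # part (d)
print(round(p_pd_pa, 3), round(p_nd_pa, 2),
      round(p_pd_given_pa, 3), round(p_pd_given_pb, 3))   # 0.087 0.45 0.162 0.135
```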
Q3. [21 pts] Variable Elimination

(a) [9 pts] For the Bayes net below, we are given the query P(A, E | +c). All variables have binary domains. Assume we run variable elimination to compute the answer to this query, with the following variable elimination ordering: B, D, G, F. Complete the following description of the factors generated in this process:

After inserting evidence, we have the following factors to start out with:
P(A), P(B | A), P(+c), P(D | A, B, +c), P(E | D), P(F | D), P(G | +c, F)

When eliminating B we generate a new factor f1 as follows:
f1(A, +c, D) = Σ_b P(b | A) P(D | A, b, +c)
This leaves us with the factors: P(A), P(+c), P(E | D), P(F | D), P(G | +c, F), f1(A, +c, D)

When eliminating D we generate a new factor f2 as follows:
f2(A, +c, E, F) = Σ_d P(E | d) P(F | d) f1(A, +c, d)
This leaves us with the factors: P(A), P(+c), P(G | +c, F), f2(A, +c, E, F)

When eliminating G we generate a new factor f3 as follows:
f3(+c, F) = Σ_g P(g | +c, F)
This leaves us with the factors: P(A), P(+c), f2(A, +c, E, F), f3(+c, F)
(Let's make sure to account for error propagation in our grading of this one.)

When eliminating F we generate a new factor f4 as follows:
f4(A, +c, E) = Σ_f f2(A, +c, E, f) f3(+c, f)
This leaves us with the factors: P(A), P(+c), f4(A, +c, E)

(b) [2 pts] Write a formula to compute P(A, E | +c) from the remaining factors.
P(A, E | +c) = P(A) P(+c) f4(A, +c, E) / Σ_{a,e} P(a) P(+c) f4(a, +c, e)
or alternatively: P(A, E | +c) ∝ P(A) P(+c) f4(A, +c, E), together with a statement that renormalization is needed to obtain P(A, E | +c).

(c) [2 pts] Among f1, f2, f3, f4, which is the largest factor generated, and how large is it? Assume all variables have binary domains and measure the size of each factor by the number of rows in the table that would represent the factor.
f2(A, +c, E, F) is the largest factor generated. It has 3 non-instantiated variables, hence 2^3 = 8 entries.

(d) [8 pts] Find a variable elimination ordering for the same query, i.e., for P(A, E | +c), for which the maximum size factor generated along the way is smallest. Hint: the maximum size factor generated in your solution should have only 2 variables, for a table of size 2^2 = 4. Fill in the variable elimination ordering and the factors generated into the table below.

Variable Eliminated | Factor Generated
B | f1(A, +c, D)
G | f2(+c, F)
F | f3(+c, D)
D | f4(A, +c, E)

For example, in the naive ordering we used earlier, the first row in this table would have had the following two entries: B, f1(A, +c, D). Note: multiple orderings are possible. An ordering is good if it eliminates all non-query variables (B, D, F, G) and its largest factor has only two variables.
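The factor-size bookkeeping in parts (c) and (d) can be mechanized by tracking only the scopes (variable sets) of the factors, without computing any numbers. This is an illustrative sketch, not a required exam method; the function name and scope encoding are editorial choices.

```python
# Compare elimination orderings by the size of the largest intermediate factor.
def max_factor_size(initial_scopes, ordering):
    """Each scope is a set of non-instantiated (binary) variables.
    Eliminating X joins all factors mentioning X and sums X out."""
    factors = [frozenset(s) for s in initial_scopes]
    largest = 0
    for var in ordering:
        touching = [f for f in factors if var in f]
        factors = [f for f in factors if var not in f]
        new_scope = frozenset().union(*touching) - {var}
        factors.append(new_scope)
        largest = max(largest, 2 ** len(new_scope))  # 2^k rows for k binary vars
    return largest

# Factor scopes after inserting evidence +c (only free variables listed):
# P(A), P(B|A), P(+c), P(D|A,B,+c), P(E|D), P(F|D), P(G|+c,F)
scopes = [{'A'}, {'A', 'B'}, set(), {'A', 'B', 'D'},
          {'D', 'E'}, {'D', 'F'}, {'F', 'G'}]
print(max_factor_size(scopes, ['B', 'D', 'G', 'F']))  # naive ordering: 8
print(max_factor_size(scopes, ['B', 'G', 'F', 'D']))  # better ordering: 4
```

The first call reproduces the size-8 factor f2 from part (c); the second matches the hint in part (d) that a good ordering never creates a factor with more than two variables.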
Q4. [14 pts] Bayes Net Sampling

Assume you are given the following Bayes net (A and B are the parents of C, and C is the parent of D) and the corresponding distributions over the variables in the Bayes net.

P(A): P(+a) = 0.1, P(-a) = 0.9
P(B): P(+b) = 0.7, P(-b) = 0.3
P(C | A, B): P(+c | +a, +b) = 0.25, P(-c | +a, +b) = 0.75; P(+c | -a, +b) = 0.6, P(-c | -a, +b) = 0.4; P(+c | +a, -b) = 0.5, P(-c | +a, -b) = 0.5; P(+c | -a, -b) = 0.2, P(-c | -a, -b) = 0.8
P(D | C): P(+d | +c) = 0.5, P(-d | +c) = 0.5; P(+d | -c) = 0.8, P(-d | -c) = 0.2

(a) [2 pts] Assume we receive evidence that A = +a. If we were to draw samples using rejection sampling, on expectation what percentage of the samples will be rejected?
Since P(+a) = 1/10, we would expect that only 10% of the samples could be saved. Therefore, on expectation 90% of the samples will be rejected.

(b) [6 pts] Next, assume we observed both A = +a and D = +d. What are the weights for the following samples under likelihood weighting sampling?

Sample (+a, -b, +c, +d): weight = P(+a) · P(+d | +c) = (0.1)(0.5) = 0.05
Sample (+a, -b, -c, +d): weight = P(+a) · P(+d | -c) = (0.1)(0.8) = 0.08
Sample (+a, +b, -c, +d): weight = P(+a) · P(+d | -c) = (0.1)(0.8) = 0.08

(c) [2 pts] Given the samples in the previous question, estimate P(-b | +a, +d).
P(-b | +a, +d) ≈ (0.05 + 0.08) / (0.05 + 0.08 + 0.08) = 0.13 / 0.21 ≈ 0.62

(d) [4 pts] Assume we need to (approximately) answer two different inference queries for this graph: P(C | +a) and P(C | +d). You are required to answer one query using likelihood weighting and one query using Gibbs sampling. In each case you can only collect a relatively small number of samples, so for maximal accuracy you need to cleverly match algorithm to query based on how well the algorithm fits the query. Which query would you answer with each algorithm?

Likelihood Weighting: P(C | +a). Gibbs Sampling: P(C | +d).

Justification: You should use Gibbs sampling for the query P(C | +d), because likelihood weighting only takes upstream evidence into account when sampling. Gibbs sampling, which utilizes both upstream and downstream evidence, is better suited to the query P(C | +d), which has downstream evidence.
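A likelihood-weighting sampler for this network, with evidence A = +a and D = +d, can be sketched as below. All CPT values are copied from the tables above; the structure of the code (names, sample count, seed) is an illustrative assumption. Note that with many samples the estimate approaches the true posterior P(-b | +a, +d), not the three-sample estimate from part (c).

```python
import random

# Likelihood weighting for the network above (A, B -> C -> D),
# with evidence A = +a and D = +d.
P_A = {'+a': 0.1, '-a': 0.9}
P_B = {'+b': 0.7, '-b': 0.3}
P_C = {('+a', '+b'): 0.25, ('-a', '+b'): 0.6,    # P(+c | a, b)
       ('+a', '-b'): 0.5, ('-a', '-b'): 0.2}
P_D = {'+c': 0.5, '-c': 0.8}                     # P(+d | c)

def weighted_sample(rng):
    # Evidence variables are fixed; each contributes its likelihood to the weight.
    a, weight = '+a', P_A['+a']
    b = '+b' if rng.random() < P_B['+b'] else '-b'
    c = '+c' if rng.random() < P_C[(a, b)] else '-c'
    weight *= P_D[c]              # evidence D = +d
    return (a, b, c, '+d'), weight

rng = random.Random(0)
samples = [weighted_sample(rng) for _ in range(100_000)]
num = sum(w for (a, b, c, d), w in samples if b == '-b')
den = sum(w for _, w in samples)
estimate = num / den              # estimate of P(-b | +a, +d)
print(round(estimate, 2))
```

The exact posterior here is 0.0195 / 0.07025 ≈ 0.278, so the small-sample estimate of 0.62 in part (c) illustrates how noisy three samples can be.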
Q5. [28 pts] Probability, Bayes Nets and Decision Networks

It is Monday night, and Bob is finishing up preparing for the CS 188 Midterm II that is coming up on Tuesday. Bob has already mastered all the topics except one: Decision Networks. He is contemplating whether to spend the remainder of his evening reviewing that topic (review) or just going to sleep (sleep). Decision Networks are either going to be on the test (+d) or not be on the test (-d). His utility of satisfaction is affected only by these two variables, as shown below:

P(D): P(+d) = 0.5, P(-d) = 0.5
U(D, A): U(+d, review) = 1000, U(-d, review) = 600, U(+d, sleep) = 0, U(-d, sleep) = 1500

(a) [5 pts] Maximum Expected Utility. Compute the following quantities:

EU(review) = P(+d) U(+d, review) + P(-d) U(-d, review) = (0.5)(1000) + (0.5)(600) = 800
EU(sleep) = P(+d) U(+d, sleep) + P(-d) U(-d, sleep) = (0.5)(0) + (0.5)(1500) = 750
MEU({}) = max(800, 750) = 800
Action that achieves MEU({}): review

This result notwithstanding, you should get some sleep.
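The part (a) computation can be checked mechanically; a minimal sketch, using the prior and utility table above:

```python
# MEU with no evidence, for the decision network of part (a).
P_D = {'+d': 0.5, '-d': 0.5}
U = {('+d', 'review'): 1000, ('-d', 'review'): 600,
     ('+d', 'sleep'): 0, ('-d', 'sleep'): 1500}

def eu(action):
    # Expected utility of an action: sum over D of P(d) * U(d, action).
    return sum(P_D[d] * U[(d, action)] for d in P_D)

eus = {a: eu(a) for a in ('review', 'sleep')}
meu = max(eus.values())
best = max(eus, key=eus.get)
print(eus['review'], eus['sleep'], meu, best)  # 800.0 750.0 800.0 review
```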
(b) [11 pts] The TA is on Facebook

The TA's happiness (H) is affected by whether decision networks are going to be on the exam, and the happiness (H) determines whether the TA posts on Facebook (+f) or doesn't post on Facebook (-f). The prior on D and the utility table remain unchanged.

Tables that define the model:
P(F | H): P(+f | +h) = 0.6, P(-f | +h) = 0.4; P(+f | -h) = 0.2, P(-f | -h) = 0.8
P(H | D): P(+h | +d) = 0.95, P(-h | +d) = 0.05; P(+h | -d) = 0.25, P(-h | -d) = 0.75
P(D): P(+d) = 0.5, P(-d) = 0.5
U(D, A): as in part (a)

Tables computed from the first set of tables (some of them might be convenient for answering the questions below):
P(H): P(+h) = 0.6, P(-h) = 0.4
P(F): P(+f) = 0.44, P(-f) = 0.56
P(D | F): P(+d | +f) ≈ 0.66, P(-d | +f) ≈ 0.34; P(+d | -f) ≈ 0.37, P(-d | -f) ≈ 0.63
P(F | D): P(+f | +d) = 0.58, P(-f | +d) = 0.42; P(+f | -d) = 0.30, P(-f | -d) = 0.70
P(D | H): P(+d | +h) ≈ 0.79, P(-d | +h) ≈ 0.21; P(+d | -h) ≈ 0.06, P(-d | -h) ≈ 0.94

Compute the following quantities:

EU(review | +f) = P(+d | +f) U(+d, review) + P(-d | +f) U(-d, review) = 866.4
EU(sleep | +f) = P(+d | +f) U(+d, sleep) + P(-d | +f) U(-d, sleep) = 501
MEU({+f}) = max(866.4, 501) = 866.4
Optimal Action({+f}) = review

EU(review | -f) = P(+d | -f) U(+d, review) + P(-d | -f) U(-d, review) = (0.37)(1000) + (0.63)(600) = 748
EU(sleep | -f) = P(+d | -f) U(+d, sleep) + P(-d | -f) U(-d, sleep) = (0.63)(1500) = 945
MEU({-f}) = max(748, 945) = 945
Optimal Action({-f}) = sleep

VPI({F}) = P(+f) MEU({+f}) + P(-f) MEU({-f}) - MEU({}) = (0.44)(866.4) + (0.56)(945) - 800 ≈ 110.4
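The VPI line above follows directly from the conditional MEUs and the prior on F; a minimal sketch of that final step, taking the MEU values from the solution as given:

```python
# VPI of observing F, computed from the quantities above.
P_f = {'+f': 0.44, '-f': 0.56}    # P(-f) = 0.56 is given; P(+f) is its complement
MEU_given_f = {'+f': 866.4, '-f': 945}
MEU_empty = 800

# VPI({F}) = E_f[ MEU({f}) ] - MEU({}): the expected gain from observing F
# before acting, minus the best we could do without observing it.
vpi = sum(P_f[f] * MEU_given_f[f] for f in P_f) - MEU_empty
print(round(vpi, 1))  # 110.4
```

VPI is always non-negative, and here it is large because observing -f flips the optimal action from review to sleep.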
(c) VPI Comparisons

Now consider the case where there are n TAs. Each TA follows the same probabilistic models for happiness (H) and posting on Facebook (F) as in the previous question.

(i) [3 pts] VPI(H1 | F1) = 0. False. F1 is just a noisy version of H1, so finding out H1 gives us more information about D even when we have already observed F1. This in turn allows us to make the right decision between sleep and review more often.

(ii) [3 pts] VPI(F1 | H1) = 0. True. The parent variable of the utility node, D, is conditionally independent of F1 given H1.

(iii) [3 pts] VPI(F3 | F2, F1) > VPI(F2 | F1). False. The Fi variables give us noisy information about D. The more Fi variables we get to observe, the better the chance that we end up making the right decision. The more Fi variables we have already observed, however, the less an additional observation of a new variable Fj will influence the distribution of D.

(iv) [3 pts] VPI(F1, F2, ..., Fn) < VPI(H1, H2, ..., Hn). True. The Fi variables are noisy versions of the Hi variables, hence observing the Hi variables is more valuable.
Q6. [8 pts] Perceptron

You have decided to become a teacher. The only issue is that you don't want to spend lots of time grading essays, so instead you decide to grade them all with a linear classifier. Your classifier considers the number of 7-letter words (f7) and 8-letter words (f8) in an essay and then assigns a grade, either A or F, based on those two numbers. You have four graded essays to learn from:

BIAS  f7  f8  grade
              A (+)
              F (-)
              A (+)
              F (-)

(a) [2 pts] You decide to run the perceptron algorithm and, being optimistic about the students' essay-writing capabilities, you initialize your weight vector as (1, 0, 0). If the score from your classifier is greater than 0, it gives an A; if it is 0 or lower, it gives an F. Fill in the resulting weight vector after having seen the first training example and after having seen the second training example. Use the perceptron update rule.

BIAS  f7  f8
Initial: (1, 0, 0)
After first training example:
After second training example:

(b) [2 pts] The training data is linearly separable with the given features. True. Justification: one justification is to draw the points in the 2-D plane and show that a linear decision boundary separates the classes. Another is to provide a weight vector w that classifies all data points correctly; w = (-2.5, 1, 1) is such a weight vector.

(c) [4 pts] For each of the following decision rules, indicate whether there is a weight vector that represents the decision rule. If Yes, then include such a weight vector.
1. A paper gets an A if and only if it satisfies f7 + f8 >= 7. Yes: w = (-6.5, 1, 1).
2. A paper gets an A if and only if it satisfies (f7 >= 5 AND f8 >= 4). No.
3. A paper gets an A if and only if it satisfies (f7 >= 5 OR f8 >= 4). No.
4. A paper gets an A if and only if it has between 4 and 6, inclusive, 7-letter words and between 3 and 5, inclusive, 8-letter words. No.
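The update rule asked for in part (a) can be sketched as follows. The essays' feature values did not survive transcription, so the data below is purely hypothetical, chosen only to illustrate how the weight vector changes on a mistake.

```python
# Perceptron update rule: on a mistake, add label * x to the weights.
def perceptron_pass(w, examples):
    for x, label in examples:                 # x = (BIAS, f7, f8); label = +1 (A) or -1 (F)
        score = sum(wi * xi for wi, xi in zip(w, x))
        predicted = +1 if score > 0 else -1   # ties (score == 0) are graded F
        if predicted != label:
            w = [wi + label * xi for wi, xi in zip(w, x)]
    return w

w = [1, 0, 0]                                  # optimistic initialization from part (a)
examples = [((1, 2, 4), +1), ((1, 1, 0), -1)]  # hypothetical feature values
print(perceptron_pass(w, examples))            # [0, -1, 0]
```

With this data, the first example is classified correctly (score 1 > 0, grade A), so no update occurs; the second is misclassified, so the example is subtracted from the weights.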
More informationCS 5522: Artificial Intelligence II
CS 5522: Artificial Intelligence II Bayes Nets: Independence Instructor: Alan Ritter Ohio State University [These slides were adapted from CS188 Intro to AI at UC Berkeley. All materials available at http://ai.berkeley.edu.]
More informationProbabilistic Graphical Models for Image Analysis - Lecture 1
Probabilistic Graphical Models for Image Analysis - Lecture 1 Alexey Gronskiy, Stefan Bauer 21 September 2018 Max Planck ETH Center for Learning Systems Overview 1. Motivation - Why Graphical Models 2.
More informationName (NetID): (1 Point)
CS446: Machine Learning Fall 2016 October 25 th, 2016 This is a closed book exam. Everything you need in order to solve the problems is supplied in the body of this exam. This exam booklet contains four
More informationCS188: Artificial Intelligence, Fall 2010 Written 3: Bayes Nets, VPI, and HMMs
CS188: Artificial Intelligence, Fall 2010 Written 3: Bayes Nets, VPI, and HMMs Due: Tuesday 11/23 in 283 Soda Drop Box by 11:59pm (no slip days) Policy: Can be solved in groups (acknowledge collaborators)
More informationFinal. CS 188 Fall Introduction to Artificial Intelligence
S 188 Fall 2012 Introduction to rtificial Intelligence Final You have approximately 3 hours. The exam is closed book, closed notes except your three one-page crib sheets. Please use non-programmable calculators
More informationAndrew/CS ID: Midterm Solutions, Fall 2006
Name: Andrew/CS ID: 15-780 Midterm Solutions, Fall 2006 November 15, 2006 Place your name and your andrew/cs email address on the front page. The exam is open-book, open-notes, no electronics other than
More informationMidterm exam CS 189/289, Fall 2015
Midterm exam CS 189/289, Fall 2015 You have 80 minutes for the exam. Total 100 points: 1. True/False: 36 points (18 questions, 2 points each). 2. Multiple-choice questions: 24 points (8 questions, 3 points
More informationCS 188: Artificial Intelligence Spring Announcements
CS 188: Artificial Intelligence Spring 2011 Lecture 16: Bayes Nets IV Inference 3/28/2011 Pieter Abbeel UC Berkeley Many slides over this course adapted from Dan Klein, Stuart Russell, Andrew Moore Announcements
More informationIntroduction to Machine Learning Midterm Exam
10-701 Introduction to Machine Learning Midterm Exam Instructors: Eric Xing, Ziv Bar-Joseph 17 November, 2015 There are 11 questions, for a total of 100 points. This exam is open book, open notes, but
More informationBayes Net Representation. CS 188: Artificial Intelligence. Approximate Inference: Sampling. Variable Elimination. Sampling.
188: Artificial Intelligence Bayes Nets: ampling Bayes Net epresentation A directed, acyclic graph, one node per random variable A conditional probability table (PT) for each node A collection of distributions
More informationFinal Exam. December 11 th, This exam booklet contains five problems, out of which you are expected to answer four problems of your choice.
CS446: Machine Learning Fall 2012 Final Exam December 11 th, 2012 This is a closed book exam. Everything you need in order to solve the problems is supplied in the body of this exam. Note that there is
More informationFinal. Introduction to Artificial Intelligence. CS 188 Summer 2014
S 188 Summer 2014 Introduction to rtificial Intelligence Final You have approximately 2 hours and 50 minutes. The exam is closed book, closed notes except your two-page crib sheet. Mark your answers ON
More information10-701/15-781, Machine Learning: Homework 4
10-701/15-781, Machine Learning: Homewor 4 Aarti Singh Carnegie Mellon University ˆ The assignment is due at 10:30 am beginning of class on Mon, Nov 15, 2010. ˆ Separate you answers into five parts, one
More informationFinal Exam, Fall 2002
15-781 Final Exam, Fall 22 1. Write your name and your andrew email address below. Name: Andrew ID: 2. There should be 17 pages in this exam (excluding this cover sheet). 3. If you need more room to work
More informationMath 110 (Fall 2018) Midterm II (Monday October 29, 12:10-1:00)
Math 110 (Fall 2018) Midterm II (Monday October 29, 12:10-1:00) Name: SID: Please write clearly and legibly. Justify your answers. Partial credits may be given to Problems 2, 3, 4, and 5. The last sheet
More informationUNIVERSITY of PENNSYLVANIA CIS 520: Machine Learning Final, Fall 2014
UNIVERSITY of PENNSYLVANIA CIS 520: Machine Learning Final, Fall 2014 Exam policy: This exam allows two one-page, two-sided cheat sheets (i.e. 4 sides); No other materials. Time: 2 hours. Be sure to write
More informationLast/Family Name First/Given Name Seat # Exam # Failure to follow the instructions below will constitute a breach of the Honor Code:
Math 21, Winter 2018 Schaeffer/Solis Midterm Exam 2 (February 28th, 2018) Last/Family Name First/Given Name Seat # Exam # Failure to follow the instructions below will constitute a breach of the Honor
More informationFINAL EXAM: FALL 2013 CS 6375 INSTRUCTOR: VIBHAV GOGATE
FINAL EXAM: FALL 2013 CS 6375 INSTRUCTOR: VIBHAV GOGATE You are allowed a two-page cheat sheet. You are also allowed to use a calculator. Answer the questions in the spaces provided on the question sheets.
More informationArtificial Intelligence Bayes Nets: Independence
Artificial Intelligence Bayes Nets: Independence Instructors: David Suter and Qince Li Course Delivered @ Harbin Institute of Technology [Many slides adapted from those created by Dan Klein and Pieter
More informationMath 115 Practice for Exam 2
Math 115 Practice for Exam Generated October 30, 017 Name: SOLUTIONS Instructor: Section Number: 1. This exam has 5 questions. Note that the problems are not of equal difficulty, so you may want to skip
More informationIntroduction to Spring 2009 Artificial Intelligence Midterm Solutions
S 88 Introduction to Spring 009 rtificial Intelligence Midterm Solutions. (6 points) True/False For the following questions, a correct answer is worth points, no answer is worth point, and an incorrect
More informationIntroduction to Fall 2011 Artificial Intelligence Final Exam
CS 188 Introduction to Fall 2011 rtificial Intelligence Final Exam INSTRUCTIONS You have 3 hours. The exam is closed book, closed notes except two pages of crib sheets. Please use non-programmable calculators
More informationCS221 Practice Midterm
CS221 Practice Midterm Autumn 2012 1 ther Midterms The following pages are excerpts from similar classes midterms. The content is similar to what we ve been covering this quarter, so that it should be
More information2. (10 pts) How many vectors are in the null space of the matrix A = 0 1 1? (i). Zero. (iv). Three. (ii). One. (v).
Exam 3 MAS 3105 Applied Linear Algebra, Spring 2018 (Clearly!) Print Name: Apr 10, 2018 Read all of what follows carefully before starting! 1. This test has 7 problems and is worth 110 points. Please be
More informationCS 188 Introduction to AI Fall 2005 Stuart Russell Final
NAME: SID#: Section: 1 CS 188 Introduction to AI all 2005 Stuart Russell inal You have 2 hours and 50 minutes. he exam is open-book, open-notes. 100 points total. Panic not. Mark your answers ON HE EXAM
More informationFinal exam of ECE 457 Applied Artificial Intelligence for the Spring term 2007.
Spring 2007 / Page 1 Final exam of ECE 457 Applied Artificial Intelligence for the Spring term 2007. Don t panic. Be sure to write your name and student ID number on every page of the exam. The only materials
More informationTopic Models. Brandon Malone. February 20, Latent Dirichlet Allocation Success Stories Wrap-up
Much of this material is adapted from Blei 2003. Many of the images were taken from the Internet February 20, 2014 Suppose we have a large number of books. Each is about several unknown topics. How can
More informationCS 188: Artificial Intelligence. Machine Learning
CS 188: Artificial Intelligence Review of Machine Learning (ML) DISCLAIMER: It is insufficient to simply study these slides, they are merely meant as a quick refresher of the high-level ideas covered.
More informationAE = q < H(p < ) + (1 q < )H(p > ) H(p) = p lg(p) (1 p) lg(1 p)
1 Decision Trees (13 pts) Data points are: Negative: (-1, 0) (2, 1) (2, -2) Positive: (0, 0) (1, 0) Construct a decision tree using the algorithm described in the notes for the data above. 1. Show the
More informationCS 188: Artificial Intelligence Spring Announcements
CS 188: Artificial Intelligence Spring 2011 Lecture 14: Bayes Nets II Independence 3/9/2011 Pieter Abbeel UC Berkeley Many slides over the course adapted from Dan Klein, Stuart Russell, Andrew Moore Announcements
More informationName: Josh Hug Your EdX Login: SID: Name of person to left: Josh Hug Exam Room: Josh Hug Name of person to right: Josh Hug Primary TA: Adam Janin
UC Berkeley Computer Science CS188: Introduction to Artificial Intelligence Josh Hug and Adam Janin Midterm II, Fall 2016 Solutions This test has 7 questions worth a total of 100 points, to be completed
More informationCS446: Machine Learning Fall Final Exam. December 6 th, 2016
CS446: Machine Learning Fall 2016 Final Exam December 6 th, 2016 This is a closed book exam. Everything you need in order to solve the problems is supplied in the body of this exam. This exam booklet contains
More information10-701/ Machine Learning - Midterm Exam, Fall 2010
10-701/15-781 Machine Learning - Midterm Exam, Fall 2010 Aarti Singh Carnegie Mellon University 1. Personal info: Name: Andrew account: E-mail address: 2. There should be 15 numbered pages in this exam
More informationCOS402- Artificial Intelligence Fall Lecture 10: Bayesian Networks & Exact Inference
COS402- Artificial Intelligence Fall 2015 Lecture 10: Bayesian Networks & Exact Inference Outline Logical inference and probabilistic inference Independence and conditional independence Bayes Nets Semantics
More information